In police face recognition, the stakes of accuracy are high. An accurate algorithm correctly identifies a face in an ATM photo and leads police to a robber’s door. An inaccurate algorithm sends them to the wrong house, and could send an innocent person to jail.175
Face recognition companies understand this, and advertise seemingly sky-high accuracy figures to police departments. The website of FaceFirst, which uses Cognitec’s algorithm in face recognition software that it sells to police, states that “[w]ith an identification rate above 95% as measured by U.S. government-sponsored Face Recognition Vendor Tests, our technology is the industry’s finest.”176 This is misleading: the 95% figure is a decade old, and it collapses the many dimensions of accuracy into a single number from a single test.177 Since 2006, Cognitec’s algorithm has undoubtedly changed dramatically, and the tests have certainly gotten harder too.178
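To see why a single number can mislead, consider a back-of-the-envelope calculation. The figures below are our own illustrative assumptions, not measurements of FaceFirst’s or Cognitec’s algorithms: even a very small false match rate per one-to-one comparison produces large numbers of innocent “hits” when a search runs against a big database.

```python
# Back-of-the-envelope illustration (hypothetical figures, not any
# vendor's measured performance): the same per-comparison false match
# rate yields very different numbers of innocent "hits" depending on
# how many photos are searched.

def expected_false_matches(gallery_size: int, false_match_rate: float) -> float:
    """Expected number of innocent gallery photos scoring above the
    match threshold in a single one-to-many search."""
    return gallery_size * false_match_rate

# Assume a 0.1% false match rate per one-to-one comparison (illustrative).
FMR = 0.001

for gallery_size in (10_000, 1_000_000, 30_000_000):
    hits = expected_false_matches(gallery_size, FMR)
    print(f"gallery of {gallery_size:>10,} photos: ~{hits:>8,.0f} false candidates per search")
```

On these assumptions, a search of a 10,000-photo mug shot database produces around 10 false candidates, while the same algorithm searching a 30 million-photo driver’s license database produces around 30,000. A headline “identification rate” from one test on one dataset says nothing about this scaling.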
In fact, FaceFirst has made sure that it will not be held to this high standard. A 2015 contract with the San Diego Association of Governments, which runs one of the largest police face recognition systems in the country, includes the following disclaimer: “FaceFirst makes no representations or warranties as to the accuracy and reliability of the product in the performance of its facial recognition capabilities.”179
Compared to fingerprinting, state-of-the-art face recognition is far less reliable and far less thoroughly tested. Yet beyond instructing the recipients of potential face recognition matches that search results are only investigative leads, not conclusive evidence, jurisdictions and other stakeholders take too few steps to protect against false positives and other errors.
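The reason results must be treated as leads is structural: a one-to-many search ranks the entire gallery by similarity and returns the closest entries whether or not the person in the probe photo is enrolled at all. The sketch below illustrates this with synthetic vectors standing in for face templates; the cosine-similarity comparison and all numbers are our simplifying assumptions, not any vendor’s implementation.

```python
# Schematic sketch with synthetic data (not any vendor's system): a
# one-to-many search always returns a ranked candidate list, even when
# the person in the probe photo is not enrolled in the gallery at all.
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for face templates: unit vectors compared by cosine similarity.
gallery = rng.normal(size=(10_000, 128))
gallery /= np.linalg.norm(gallery, axis=1, keepdims=True)

probe = rng.normal(size=128)               # a person NOT in the gallery
probe /= np.linalg.norm(probe)

scores = gallery @ probe                   # similarity to every enrolled face
candidates = np.argsort(scores)[::-1][:5]  # the list an analyst would review

for rank, idx in enumerate(candidates, 1):
    print(f"rank {rank}: gallery entry {idx}, similarity {scores[idx]:.3f}")

# Without a validated score threshold, the top-ranked "match" for an
# unenrolled probe is still some enrolled, innocent person.
```

Because the system will surface somebody’s face for every probe, safeguards have to come from outside the algorithm: validated score thresholds, trained human review, and policies that forbid treating a candidate list as an identification.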
- 175. See Simson Garfinkel, Future Tech, Discover, 23.9 (2002): 17–20 (reporting false positive error generated by face recognition technology in use at the Fresno Yosemite International Airport), http://simson.net/clips/2002/2002.Discover.09.FaceID.pdf; cf. Eric Lichtblau, U.S. Will Pay $2 Million to Lawyer Wrongly Jailed, N.Y. Times (Nov. 30, 2006) (describing the case of Brandon Mayfield, who was wrongly linked to the 2004 Madrid train bombings as the result of a faulty fingerprint identification); Office of the Inspector General, U.S. Department of Justice, A Review of the FBI’s Handling of the Brandon Mayfield Case (Jan. 2006) at 1, https://oig.justice.gov/special/s0601/exec.pdf (describing process by which automated fingerprint matching system and FBI human examiner incorrectly matched Mayfield’s prints to Madrid bomber’s).
- 176. FaceFirst, Frequently Asked Questions, http://www.facefirst.com/faq (last visited Sept. 1, 2016).
- 177. See FaceFirst, Frequently Asked Questions, http://www.facefirst.com/faq (last visited Sept. 1, 2016) (archived copy available at https://web.archive.org/web/20160119232512/http://www.facefirst.com/faq and on file with authors) (acknowledging 95% figure is drawn from a 2006 accuracy test).
- 178. See Patrick Grother and Mei Ngan, Face Recognition Vendor Test: Performance of Face Identification Algorithms, NIST Interagency Report 8009 (May 26, 2014), http://biometrics.nist.gov/cs_links/face/frvt/frvt2013/NIST_8009.pdf.
- 179. SANDAG, ARJIS Contract with FaceFirst, LLC, Document p. 008358.